Natural Deduction for Intuitionistic Non-Commutative Linear Logic

Natural Deduction for Intuitionistic Non-Commutative Linear Logic, Jeff Polakow and Frank Pfenning. TLCA 1999.

Intuitionistic logic captures functional programming in a logical way, as can be seen from the Curry-Howard isomorphism between constructive proofs and functional programs. However, there are many structural properties of programs that are not captured within the intuitionistic framework, such as resource usage, computational complexity, and sequentiality. Intuitionistic linear logic can be thought of as a refinement of intuitionistic logic in which the resource consumption properties of functions can be expressed internally. Here, we further refine it to allow the expression of sequencing of computations. We achieve this by controlling the use of the structural rule of exchange to arrive at intuitionistic non-commutative linear logic.

My earlier post on linguistics reminded me of the Lambek calculus, which is an ordered logic invented in 1958(!) to model how to parse sentences. So I wanted to find a paper on ordered logic (i.e., one in which you can't freely swap the order of hypotheses in a context) and link to that.
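
To get a concrete feel for what "ordered" means, here is a minimal sketch of Lambek-style directional categories in Haskell. It's my own illustration, not anything from the paper: the two directional implications only accept arguments on the correct side, so adjacent hypotheses can combine but can never be reordered.

    -- A minimal sketch of Lambek-style directional categories (my own
    -- illustration; the grammar and the reduction strategy are deliberately naive).
    data Cat
      = NP                -- noun phrase
      | S                 -- sentence
      | Cat :/ Cat        -- x :/ y looks for a y on its RIGHT and yields x
      | Cat :\ Cat        -- x :\ y looks for an x on its LEFT and yields y
      deriving (Eq, Show)

    -- Combine two adjacent categories, respecting their order.
    step :: Cat -> Cat -> Maybe Cat
    step (x :/ y) y' | y == y' = Just x   -- forward application
    step x' (x :\ y) | x == x' = Just y   -- backward application
    step _ _ = Nothing

    -- Repeatedly contract the first reducible adjacent pair; there is no
    -- exchange rule, so permuting the input changes what (if anything) parses.
    reduce :: [Cat] -> [Cat]
    reduce cats =
      case [ pre ++ [c] ++ post
           | (pre, a : b : post) <- [ splitAt n cats | n <- [0 .. length cats - 1] ]
           , Just c <- [step a b] ] of
        []       -> cats
        (c' : _) -> reduce c'

    -- "Mary saw John": NP, (NP :\ S) :/ NP, NP reduces to [S];
    -- swapping the verb and the object leaves the list stuck.
    main :: IO ()
    main = do
      print (reduce [NP, (NP :\ S) :/ NP, NP])
      print (reduce [NP, NP, (NP :\ S) :/ NP])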

On the origins of Bisimulation, Coinduction, and Fixed Points

Davide Sangiorgi, On the origins of Bisimulation, Coinduction, and Fixed Points.

The origins of bisimulation and bisimilarity are examined, in the three fields where they have been independently discovered: Computer Science, Philosophical Logic (precisely, Modal Logic), and Set Theory.

Bisimulation and bisimilarity are coinductive notions, and as such are intimately related to fixed points, in particular greatest fixed points. Therefore the appearance of coinduction and fixed points is also discussed, though in this case only within Computer Science. The paper ends with some historical remarks on the main fixed-point theorems (such as Knaster-Tarski) that underpin the fixed-point theory presented.

There is a wealth of interesting information in this paper. Alas, it is not very easy to read, and the exposition can be improved. So this is not for beginners or outsiders, but if you are familiar with the topic the historical discussion will be of interest.
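
To make the fixed-point connection concrete for anyone new to it, here is a small sketch of my own (not from the paper): bisimilarity on a tiny, invented labelled transition system, computed as a greatest fixed point by iterating the usual refinement operator downward from the full relation until it stabilizes, which on a finite lattice reaches the greatest fixed point that Knaster-Tarski guarantees.

    -- Bisimilarity on a tiny, hard-coded labelled transition system, computed
    -- as a greatest fixed point (my own illustration; the LTS is invented).
    import qualified Data.Set as Set
    import Data.Set (Set)

    type State = Int
    type Label = Char

    trans :: [(State, Label, State)]
    trans = [(1,'a',2), (2,'b',1), (3,'a',4), (4,'b',3)]

    states :: [State]
    states = [1, 2, 3, 4]

    labels :: [Label]
    labels = "ab"

    succs :: State -> Label -> [State]
    succs p l = [ q | (p', l', q) <- trans, p' == p, l' == l ]

    -- One refinement step: keep (p, q) only if every move of p is matched by
    -- some move of q into a pair still in the relation, and vice versa.
    refine :: Set (State, State) -> Set (State, State)
    refine r = Set.filter ok r
      where
        ok (p, q) = all matches labels
          where
            matches l =
              all (\p' -> any (\q' -> Set.member (p', q') r) (succs q l)) (succs p l) &&
              all (\q' -> any (\p' -> Set.member (p', q') r) (succs p l)) (succs q l)

    -- Greatest fixed point: start from the full relation and iterate until
    -- nothing more is removed.
    bisimilarity :: Set (State, State)
    bisimilarity = go (Set.fromList [ (p, q) | p <- states, q <- states ])
      where
        go r = let r' = refine r in if r' == r then r else go r'

    main :: IO ()
    main = print (Set.toList bisimilarity)   -- relates 1~3 and 2~4, plus each state to itself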

Online Learning of Relaxed CCG Grammars for Parsing to Logical Form

Online Learning of Relaxed CCG Grammars for Parsing to Logical Form, Luke S. Zettlemoyer and Michael Collins. Empirical Methods in Natural Language Processing and Computational Natural Language Learning, 2007.

We consider the problem of learning to parse sentences to lambda-calculus representations of their underlying semantics and present an algorithm that learns a weighted combinatory categorial grammar (CCG). A key idea is to introduce non-standard CCG combinators that relax certain parts of the grammar (for example allowing flexible word order, or insertion of lexical items) with learned costs. We also present a new, online algorithm for inducing a weighted CCG. Results for the approach on ATIS data show 86% F-measure in recovering fully correct semantic analyses and 95.9% F-measure by a partial-match criterion, a more than 5% improvement over the 90.3% partial-match figure reported by He and Young (2006).

This paper isn't exactly PL, though Ehud has been okay with the odd foray into computational linguistics before. I thought it was interesting to see machine learning work make use of a typed formalism like categorial grammars to handle long-range dependencies, and it leaves me wondering if it's possible to set these kinds of techniques onto program analysis problems.

One neat thing about the CCG formalism is that you have parsing, which is described more or less as a typechecking problem, and separately you have the semantic constraints, which are basically lambda-calculus terms that build up a term in first-order logic. So I can imagine doing something like writing down how you're supposed to use a library as semantic constraints that add up to a Prolog program, and then at compile time the compiler walks the typing derivation to construct the program, and then runs it to figure out if you're doing something dumb (according to the library designers). I don't know if that actually makes sense, but this work certainly prompted me to think.
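
For the curious, the parsing/semantics split is easy to caricature in code. Below is a toy sketch with an invented three-word lexicon: each entry pairs a directional category with a lambda term, and application both checks the categories (the "typechecking" half) and applies the semantics, so a successful parse leaves you holding a small first-order-logic-style term.

    -- A toy CCG-style lexicon: each word carries a directional category and a
    -- semantic term; application checks categories and composes semantics.
    -- Everything here (words, categories, the logic syntax) is invented.
    import Data.List (intercalate)

    data Cat = NP | S | Cat :/ Cat | Cat :\ Cat deriving (Eq, Show)

    data Sem = Const String | Pred String [Sem] | Fun (Sem -> Sem)

    entity :: String -> (Cat, Sem)
    entity name = (NP, Const name)

    -- A transitive verb: category (NP\S)/NP, semantics \obj subj -> rel(subj, obj).
    verb :: String -> (Cat, Sem)
    verb rel = ((NP :\ S) :/ NP, Fun (\obj -> Fun (\subj -> Pred rel [subj, obj])))

    -- Forward and backward application: a category check plus a semantic application.
    combine :: (Cat, Sem) -> (Cat, Sem) -> Maybe (Cat, Sem)
    combine (x :/ y, Fun f) (y', a) | y == y' = Just (x, f a)
    combine (x', a) (x :\ y, Fun f) | x == x' = Just (y, f a)
    combine _ _ = Nothing

    render :: Sem -> String
    render (Const c)   = c
    render (Pred p as) = p ++ "(" ++ intercalate "," (map render as) ++ ")"
    render (Fun _)     = "<function>"

    -- "mary likes john" parses to S with semantics likes(mary,john).
    main :: IO ()
    main =
      case combine (verb "likes") (entity "john") >>= combine (entity "mary") of
        Just (c, s) -> putStrLn (show c ++ " : " ++ render s)
        Nothing     -> putStrLn "no parse"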

Technometria: Google Web Toolkit

Phil Windley's Technometria podcast is dedicated to the Google Web Toolkit. The guest on the show is Bruce Johnson, a Tech Lead of GWT.

The show is very good, and more technical than usual. Many themes that are near and dear to LtU are discussed. Here are some pointers:

Bruce talks at length about the advantages of compiling from Java to JS, many of which arise from Java's static typing. He mainly talks about optimizations, but also about how static typing helps with tools in general (IDEs etc.). This was a subject of long and stormy debates here in the past.

The advantages, from a software engineering standpoint, of building in Java vs. JS are discussed. This is directly related to the ongoing discussion here on the new programming-in-the-large features added to JS2. I wonder if someone will write a compiler from Java/GWT to JS2 at some point, which will enable projects to move to JS2 and jump ship on Java altogether.

Bruce mentions that since JS isn't class-based, and thus doesn't directly support the OO style many people are used to, there are many ways of translating common OO idioms into JS. This is, of course, the same type of dilemma the Scheme community has about many high-level features. Cast as a question about OOP support, the question becomes: is it better to provide language constructs that allow various libraries to add OO support in different ways, or to provide language support for a specific style? The same can be asked about a variety of features and programming styles, of course.

Finally, Bruce mentions that as far as he knows no one thought about something like GWT before they did. Well, I for one, and I don't think I was the only one, talked many times (probably on LtU) about Javascript as a VM/assembly language of the browser, clearly thinking about JS as a target language. I admit I wasn't thinking about compiling Java... But then, I am not into writing Java, so why would I think about Java as the source language...

Gödel, Nagel, minds and machines

Solomon Feferman. Gödel, Nagel, minds and machines. Ernest Nagel Lecture, Columbia University, Sept. 27, 2007.

Just fifty years ago, Ernest Nagel and Kurt Goedel became involved in a serious imbroglio about the possible inclusion of Goedel’s original work on incompleteness in the book, Goedel’s Proof, then being written by Nagel with James R. Newman. What led to the conflict were some unprecedented demands that Goedel made over the use of his material and his involvement in the contents of the book - demands that resulted in an explosive reaction on Nagel’s part. In the end the proposal came to naught. But the story is of interest because of what was basically at issue, namely their provocative related but contrasting views on the possible significance of Goedel’s theorems for minds vs. machines in the development of mathematics.

This is not directly PLT related, and more philosophical than what we usually discuss on LtU, but I think it will be of interest to some members of the community.

While the historical details are interesting, I am not sure I agree with the analysis. It would be interesting to hear what others make of this.

To make this item slightly more relevant to LtU, let me point out that both the LC and category theory are mentioned (although they are really discussed only in the references).

On One-Pass CPS Transformations

Olivier Danvy, Kevin Millikin and Lasse R. Nielsen. On One-Pass CPS Transformations. March 2007.

We bridge two distinct approaches to one-pass CPS transformations, i.e., CPS transformations that reduce administrative redexes at transformation time instead of in a post-processing phase. One approach is compositional and higher-order, and is independently due to Appel, Danvy and Filinski, and Wand, building on Plotkin's seminal work. The other is non-compositional and based on a reduction semantics for the lambda-calculus, and is due to Sabry and Felleisen. To relate the two approaches, we use three tools: Reynolds's defunctionalization and its left inverse, refunctionalization; a special case of fold-unfold fusion due to Ohori and Sasano, fixed-point promotion; and an implementation technique for reduction semantics due to Danvy and Nielsen, refocusing.

This work is directly applicable to transforming programs into monadic normal form.

Also in JFP 17:793-812 (2007).
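
For anyone who hasn't seen the higher-order approach, here is a compact sketch of a one-pass CPS transformation for the pure lambda calculus, roughly in the Danvy/Filinski style the abstract mentions (my own simplification; the fresh-name handling in particular is deliberately naive). The point to notice is that the translation threads a meta-level continuation, so the administrative redexes a naive CPS transformation would emit are contracted at transformation time.

    -- A one-pass, higher-order CPS transformation for the pure lambda calculus,
    -- roughly in the Danvy/Filinski style (a simplified sketch for illustration).
    data Term
      = Var String
      | Lam String Term
      | App Term Term
      deriving Show

    -- A crude fresh-name supply, threaded by hand to keep the example small.
    type Fresh = Int

    fresh :: Fresh -> String -> (String, Fresh)
    fresh n base = (base ++ show n, n + 1)

    -- cps e n k: translate e, handing its value to the META-level continuation k.
    -- Because k is a Haskell function, administrative redexes are contracted
    -- here, at transformation time, rather than left in the output.
    cps :: Term -> Fresh -> (Term -> Fresh -> (Term, Fresh)) -> (Term, Fresh)
    cps (Var x)   n k = k (Var x) n
    cps (Lam x e) n k =
      let (kv, n1)   = fresh n "k"
          (body, n2) = cpsTail e n1 (Var kv)
      in k (Lam x (Lam kv body)) n2
    cps (App f a) n k =
      cps f n  $ \fv n1 ->
      cps a n1 $ \av n2 ->
      let (xv, n3)   = fresh n2 "x"
          (hole, n4) = k (Var xv) n3
      in (App (App fv av) (Lam xv hole), n4)

    -- Tail position: the continuation is a piece of SYNTAX, applied directly.
    cpsTail :: Term -> Fresh -> Term -> (Term, Fresh)
    cpsTail (Var x)   n c = (App c (Var x), n)
    cpsTail (Lam x e) n c =
      let (kv, n1)   = fresh n "k"
          (body, n2) = cpsTail e n1 (Var kv)
      in (App c (Lam x (Lam kv body)), n2)
    cpsTail (App f a) n c =
      cps f n  $ \fv n1 ->
      cps a n1 $ \av n2 ->
      (App (App fv av) c, n2)

    cpsProgram :: Term -> Term
    cpsProgram e = fst (cpsTail e 0 (Var "halt"))

    -- ((\x. x) y) becomes (\x. \k0. k0 x) y halt, with no administrative redexes.
    main :: IO ()
    main = print (cpsProgram (App (Lam "x" (Var "x")) (Var "y")))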

ECMAScript 4 overview paper

An overview paper describing ECMAScript 4 has been added to the ECMAScript site. It was recently announced on the mailing list:

I'm pleased to present you with an overview paper describing ES4 as the language currently stands. TG1 is no longer accepting proposals, we're working on the ES4 reference implementation, and we're expecting the standard to be finished in October 2008.
...
This paper is not a spec, it is *just* a detailed overview. Some features may be cut, others may be changed, and numerous details remain to be worked out, but by and large this is what TG1 expects the language to look like. Your comments on the paper are very much welcome. Please send bug reports directly to me, everything else to the list.

Engineering Software Correctness

Rex Page, Engineering software correctness, Proceedings of the ACM SIGPLAN 2005 Workshop on Functional and Declarative Programming in Education, September 25, 2005.

Software engineering courses offer one of many opportunities for providing students with a significant experience in declarative programming. This report discusses some results from taking advantage of this opportunity in a two-semester sequence of software engineering courses for students in their final year of baccalaureate studies in computer science. The sequence is based on functional programming using ACL2, a purely functional subset of Common Lisp with a built-in, computational logic developed by J Strother Moore and his colleagues over the past three decades...

A JFP educational pearl with the same title and a similar abstract appears in Journal of Functional Programming (2007), 17: 675-686, but I haven't managed to access it yet.


I am still in the process of finding furniture etc. But at least I have an apartment now... Thanks for all the help and tips, guys!

The End of an Architectural Era (It’s Time for a Complete Rewrite)

The End of an Architectural Era (It’s Time for a Complete Rewrite). Michael Stonebraker, Samuel Madden, Daniel J. Abadi, Stavros Harizopoulos, Nabil Hachem, Pat Helland. VLDB 2007.

A not directly PL-related paper about a new database architecture, but the authors provide some interesting and possibly controversial perspectives:

  • They split the application into per-core, single-threaded instances without any communication between them (see the sketch just after this list).
  • Instead of using SQL from an external (web app) process to communicate with the database, they envision embedding Ruby on Rails directly into the database.
  • They state that most database warehouse tasks rely on pre-canned queries only, so there is no need for ad-hoc querying.
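
To illustrate the first bullet, here is a toy sketch of the share-nothing execution model: one single-threaded worker per "core", each owning its partition of the data outright and draining its own queue of pre-canned transactions, with no shared state and no cross-partition communication. This is purely my illustration of the idea; nothing in it comes from H-Store itself.

    -- A toy share-nothing layout: one single-threaded worker per "core", each
    -- with a private store; transactions are routed by key, never shared.
    -- (My illustration of the idea only; nothing here resembles H-Store's code.)
    import Control.Concurrent (forkIO)
    import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)
    import Control.Concurrent.MVar (MVar, newEmptyMVar, putMVar, takeMVar)
    import qualified Data.Map.Strict as Map

    type Key   = Int
    type Store = Map.Map Key Int
    type Txn   = Store -> (Store, Int)   -- a pre-canned, deterministic transaction

    -- Start n workers; each loops sequentially over its own queue and store.
    startWorkers :: Int -> IO [Chan (Txn, MVar Int)]
    startWorkers n = mapM (const start) [1 .. n]
      where
        start = do
          q <- newChan
          let loop store = do
                (txn, reply) <- readChan q
                let (store', result) = txn store
                putMVar reply result
                loop store'
          _ <- forkIO (loop Map.empty)
          return q

    -- Route a transaction to the single worker that owns its key.
    submit :: [Chan (Txn, MVar Int)] -> Key -> Txn -> IO Int
    submit queues key txn = do
      reply <- newEmptyMVar
      writeChan (queues !! (key `mod` length queues)) (txn, reply)
      takeMVar reply

    main :: IO ()
    main = do
      queues <- startWorkers 4
      result <- submit queues 42 $ \store ->
        let store' = Map.insertWith (+) 42 100 store
        in (store', Map.findWithDefault 0 42 store')
      print result   -- 100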

The somewhat performance-focused abstract:

In two previous papers some of us predicted the end of "one size fits all" as a commercial relational DBMS paradigm. These papers presented reasons and experimental evidence that showed that the major RDBMS vendors can be outperformed by 1-2 orders of magnitude by specialized engines in the data warehouse, stream processing, text, and scientific data base markets.

Assuming that specialized engines dominate these markets over time, the current relational DBMS code lines will be left with the business data processing (OLTP) market and hybrid markets where more than one kind of capability is required. In this paper we show that current RDBMSs can be beaten by nearly two orders of magnitude in the OLTP market as well. The experimental evidence comes from comparing a new OLTP prototype, H-Store, which we have built at M.I.T., to one of the popular RDBMSs on the standard transactional benchmark, TPC-C.

We conclude that the current RDBMS code lines, while attempting to be a "one size fits all" solution, in fact, excel at nothing. Hence, they are 25-year-old legacy code lines that should be retired in favor of a collection of "from scratch" specialized engines. The DBMS vendors (and the research community) should start with a clean sheet of paper and design systems for tomorrow's requirements, not continue to push code lines and architectures designed for yesterday's requirements.

A critical comment by Amazon's CTO, Werner Vogels.

Privacy and Contextual Integrity: Framework and Applications

Privacy and Contextual Integrity: Framework and Applications, A. Barth, A. Datta, J.C. Mitchell, and H. Nissenbaum. Proceedings of the IEEE Symposium on Security and Privacy, May 2006.

Contextual integrity is a conceptual framework for understanding privacy expectations and their implications developed in the literature on law, public policy, and political philosophy. We formalize some aspects of contextual integrity in a logical framework for expressing and reasoning about norms of transmission of personal information. In comparison with access control and privacy policy frameworks such as RBAC, EPAL, and P3P, these norms focus on who personal information is about, how it is transmitted, and past and future actions by both the subject and the users of the information. Norms can be positive or negative depending on whether they refer to actions that are allowed or disallowed. Our model is expressive enough to capture naturally many notions of privacy found in legislation, including those found in HIPAA, COPPA, and GLBA.

A number of important problems regarding compliance with privacy norms, future requirements associated with specific actions, and relations between policies and legal standards reduce to standard decision procedures for temporal logic.

Contextual integrity is a part of a philosophical theory of privacy developed by the philosopher Helen Nissenbaum, and it's very neat to see it being applied to develop machine-checkable access-control formalisms.
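
To give a flavour of what "norms of transmission" might look like when mechanized, here is a toy sketch. The roles, attributes, and the single norm are invented for illustration, and the paper itself works in a much richer temporal logic, but the shape is there: each communication records the sender, the receiver, their roles, the data subject and the attribute, and a positive norm can condition on the past of the trace.

    -- A toy rendering of transmission norms: each message records who sends
    -- which attribute about whom to whom, and in what roles; a norm may look
    -- back at the earlier part of the trace. Everything here is invented.
    data Role = Physician | Patient | Insurer deriving (Eq, Show)
    data Attr = Condition | Address deriving (Eq, Show)

    data Msg = Msg
      { sender     :: String
      , senderRole :: Role
      , receiver   :: String
      , recvRole   :: Role
      , subject    :: String   -- who the information is about
      , attr       :: Attr
      } deriving Show

    -- A simplified positive norm: a physician may send a patient's Condition
    -- to an insurer only if the subject has previously contacted that insurer
    -- (a crude stand-in for consent, expressed as a condition on the past).
    allowed :: [Msg] -> Msg -> Bool
    allowed past m =
      case (senderRole m, recvRole m, attr m) of
        (Physician, Insurer, Condition) ->
          any (\p -> sender p == subject m && receiver p == receiver m) past
        _ -> True   -- all other transmissions are unconstrained in this toy model

    -- A trace complies if every message is allowed given the messages before it.
    compliant :: [Msg] -> Bool
    compliant = go []
      where
        go _    []       = True
        go past (m : ms) = allowed past m && go (past ++ [m]) ms

    main :: IO ()
    main = do
      let consent  = Msg "alice"  Patient   "acme" Insurer "alice" Address
          disclose = Msg "dr-bob" Physician "acme" Insurer "alice" Condition
      print (compliant [consent, disclose])   -- True: contact precedes disclosure
      print (compliant [disclose])            -- False: no prior contact from alice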